All Questions
Tagged with linear-regression, linear-algebra
17 questions
4 votes
2 answers
126 views
What type of technique can be used to solve this question?
Apologies for the ambiguous title; I do not know the term. I have data on some products with a few variables: origin, weight, brand. For example: Product A = "China, 100g, Brand X", Product B ...
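A common first step for this kind of mixed categorical/numeric data is to one-hot encode the categorical variables (origin, brand), keep the numeric weight, and fit an ordinary regression. Below is a minimal sketch assuming a price-like target column; all column names and values are hypothetical.

```python
# Minimal sketch: one-hot encode categorical product features and fit a
# linear regression. The "price" target and all values are made up.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

products = pd.DataFrame({
    "origin": ["China", "Germany", "China"],
    "weight": [100, 250, 120],
    "brand":  ["X", "Y", "X"],
    "price":  [5.0, 12.0, 6.5],            # assumed target variable
})

pre = ColumnTransformer(
    [("cat", OneHotEncoder(handle_unknown="ignore"), ["origin", "brand"])],
    remainder="passthrough",               # pass the numeric weight through
)
model = make_pipeline(pre, LinearRegression())
model.fit(products[["origin", "weight", "brand"]], products["price"])
```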
0 votes
1 answer
41 views
What (in the world) is well-conditioned vs. low rank fat-tail singular profile?
Scikit-learn has a make_regression data generator. Can someone explain to me, like I'm 5, what is meant in the help docs by "The input set can either be well ...
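For reference, make_regression exposes this choice through its effective_rank and tail_strength parameters; a minimal sketch contrasting the two profiles (the concrete values are arbitrary):

```python
# Contrast the two input profiles mentioned in the make_regression docs.
import numpy as np
from sklearn.datasets import make_regression

# Well-conditioned: effective_rank=None gives a full-rank Gaussian input
# matrix whose singular values are all of comparable size.
X_well, _ = make_regression(n_samples=200, n_features=20,
                            effective_rank=None, random_state=0)

# Low rank with a fat-tailed singular profile: most variance sits in a
# few directions (effective_rank=5), while the remaining singular values
# decay slowly (tail_strength controls how heavy that tail is).
X_low, _ = make_regression(n_samples=200, n_features=20,
                           effective_rank=5, tail_strength=0.5,
                           random_state=0)

print(np.linalg.svd(X_well, compute_uv=False))
print(np.linalg.svd(X_low, compute_uv=False))
```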
3 votes
0 answers
296 views
Multi-dimensional Euclidean R-squared - reasonable?
I have a high-dimensional space, say $\mathbb{R}^{1000}$, and I have samples $y_1, \ldots , y_n \in \mathbb{R}^{1000}$ and $\hat{y}_1, \ldots , \hat{y}_n \in \mathbb{R}^{1000}$. Would $$ R^2 = 1 - \...
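A minimal sketch of the pooled $R^2$ this seems to describe, with the residual and total sums of squares taken over every coordinate of the high-dimensional vectors (the shapes and noise level below are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(size=(50, 1000))             # samples y_1, ..., y_n
y_hat = y + 0.1 * rng.normal(size=y.shape)  # predictions \hat{y}_i

ss_res = np.sum((y - y_hat) ** 2)           # residuals over all coordinates
ss_tot = np.sum((y - y.mean(axis=0)) ** 2)  # spread around the mean vector
r2 = 1 - ss_res / ss_tot
print(r2)
```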
1 vote
1 answer
980 views
How does the equation "dW = - (2 * (X^T).dot(Y - Y_hat)) / m" come about in Linear Regression (using Matrix + Gradient Descent)?
I was trying to code Linear Regression in Python with the matrix-multiplication method and Gradient Descent, and I followed code that never mentions what the loss is, just code as per ...
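A minimal sketch, assuming the unstated loss is mean squared error $L = \frac{1}{m}\sum_i (y_i - \hat{y}_i)^2$; differentiating with respect to the weights gives exactly the $dW$ in the title:

```python
import numpy as np

def gradient_step(X, y, w, lr=0.01):
    """One gradient-descent step for linear regression with MSE loss."""
    m = X.shape[0]
    y_hat = X.dot(w)
    dW = -(2.0 / m) * X.T.dot(y - y_hat)   # d/dw of (1/m) * sum((y - Xw)^2)
    return w - lr * dW
```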
1 vote
1 answer
2k views
Dot product and linear regression
I'm studying PCA and my professor said something about finding the linear regression by taking the dot product of both axes. Could someone explain to me why? The dot product returns a number. What's ...
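For what it's worth, in simple one-feature regression the least-squares slope really is built from dot products: after centering, it is $x \cdot y / x \cdot x$. A minimal sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = 2.5 * x + rng.normal(scale=0.3, size=100)

xc, yc = x - x.mean(), y - y.mean()        # center both variables
slope = xc.dot(yc) / xc.dot(xc)            # projection coefficient of y onto x
intercept = y.mean() - slope * x.mean()
print(slope, intercept)
```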
2 votes
0 answers
139 views
Deriving vectorized form of linear regression
We first have the weights as a D-dimensional vector $w$ and a D-dimensional predictor vector $x$, both indexed by $j$. There are $N$ observations, all D-dimensional. $t$ is our targets, i.e., ...
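A minimal sketch of the vectorized form: stacking the $N$ observations as rows of an $N \times D$ design matrix $X$ lets the predictions, loss, and gradient be written without any loop over $j$ (the synthetic data is for illustration only):

```python
import numpy as np

N, D = 100, 5
rng = np.random.default_rng(2)
X = rng.normal(size=(N, D))                     # N observations, D features
w_true = rng.normal(size=D)
t = X.dot(w_true) + 0.1 * rng.normal(size=N)    # targets

w = np.zeros(D)
loss = 0.5 * np.sum((t - X.dot(w)) ** 2)        # (1/2) * ||t - Xw||^2
grad = -X.T.dot(t - X.dot(w))                   # gradient with respect to w
```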
1 vote
2 answers
8k views
Does finding the slope/intercept using the formula for m, b always give the best-fit line in linear regression?
In linear regression we have to fit different lines and choose the one with minimum error, so what is the motive of having a formula for m, b that can give the slope and intercept of the regression line, ...
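A minimal sketch checking that the closed-form $m$ and $b$ (which come from setting the derivatives of the squared error to zero) match a direct least-squares fit on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=50)
y = 4.0 * x + 2.0 + rng.normal(size=50)

m = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b = y.mean() - m * x.mean()

m_np, b_np = np.polyfit(x, y, 1)           # numpy's degree-1 least-squares fit
print(m, b, "vs", m_np, b_np)              # the two pairs agree
```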
0 votes
1 answer
139 views
Normal equation for linear regression is illogical
Currently I'm taking Andrew Ng's course. He gives the following formula to find the solution for linear regression analytically: $\theta = (X^T X)^{-1} X^T y$. He doesn't explain it, so I searched for it and ...
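A minimal sketch of that normal equation on synthetic data; np.linalg.solve is used here instead of forming the explicit inverse, which is a common numerical choice rather than part of the course formula:

```python
import numpy as np

rng = np.random.default_rng(4)
X = np.column_stack([np.ones(100), rng.normal(size=(100, 3))])  # bias column first
theta_true = np.array([1.0, 2.0, -1.0, 0.5])
y = X.dot(theta_true) + 0.1 * rng.normal(size=100)

# Solve (X^T X) theta = X^T y, i.e. theta = (X^T X)^{-1} X^T y.
theta = np.linalg.solve(X.T.dot(X), X.T.dot(y))
print(theta)
```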
1 vote
3 answers
700 views
Why is the transpose of the independent feature matrix necessary in linear regression?
I can follow the classical linear regression steps: $Xw = y$, $X^{-1}Xw = X^{-1}y$, $Iw = X^{-1}y$, $w = X^{-1}y$. However, when implementing this in Python, I see that instead of simply using ...
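The catch is that $X$ is usually tall (more observations than features), so $X^{-1}$ does not exist; multiplying by $X^T$ first produces the square, invertible matrix $X^T X$. A minimal sketch of why implementations solve $(X^T X)w = X^T y$ or call a least-squares routine instead:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(100, 3))              # 100 x 3: no plain inverse exists
w_true = np.array([1.0, -2.0, 0.5])
y = X.dot(w_true) + 0.05 * rng.normal(size=100)

w_normal = np.linalg.solve(X.T.dot(X), X.T.dot(y))   # normal equations
w_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)      # least-squares solver
print(w_normal, w_lstsq)                             # both recover w_true closely
```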
1 vote
2 answers
1k views
Gradient descent formula implementation in python
So I recently started with Andrew Ng's ML Course and this is the formula that Andrew lays out for calculating gradient descent on a linear model. $$ \theta_j = \theta_j - \alpha \frac{1}{m} \sum_{i=1}...
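A minimal sketch of that update in vectorized form (variable names approximate the course notation; the hypothesis is assumed to be $h_\theta(x) = \theta^T x$):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, iters=1000):
    """Batch gradient descent for linear regression."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        error = X.dot(theta) - y                     # h_theta(x_i) - y_i for all i
        theta -= alpha * (1.0 / m) * X.T.dot(error)  # simultaneous update of all theta_j
    return theta
```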
-1 votes
1 answer
54 views
I can't understand the polynomial in the book
I'm reading Bishop's Pattern Recognition and Machine Learning. I came across the following equation, in which I don't understand what $W$ stands for. So, what is $W$?
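If the equation in question is the polynomial curve-fitting model from early in that book (an assumption here, since the equation itself is not shown), then $W$ is simply the vector of polynomial coefficients $w_0, \ldots, w_M$ in $y(x, w) = \sum_j w_j x^j$. A minimal sketch with an arbitrary degree $M = 3$:

```python
import numpy as np

rng = np.random.default_rng(6)
x = np.linspace(0, 1, 10)
t = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=x.size)  # noisy targets

M = 3
Phi = np.vander(x, M + 1, increasing=True)    # columns 1, x, x^2, x^3
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)   # least-squares coefficient vector
print(w)
```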
1 vote
1 answer
1k views
Linear regression with white Gaussian noise
I am new to machine learning, so this question may sound fundamental. My task is to estimate the parameter vector of the equation with the least squares method: $y = \theta_0 + \theta_1x + \theta_2x^...
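A minimal sketch of least-squares estimation for a model of that truncated form, with white Gaussian noise added to the observations (the polynomial degree and noise level are assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(-1, 1, 200)
theta_true = np.array([0.5, 2.0, -1.5])
X = np.column_stack([np.ones_like(x), x, x ** 2])           # design matrix
y = X.dot(theta_true) + rng.normal(scale=0.1, size=x.size)  # white Gaussian noise

theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(theta_hat)
```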
1 vote
1 answer
147 views
Optimizing vector values for maximum correlation
I'm new to ML, linear algebra, statistics, etc. so bear with me on the terminology... I’m looking to find a vector that produces the maximum correlation for the relationship between 1) all ...
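Under one loose reading of the (truncated) question, the goal is a weight vector $w$ maximizing the Pearson correlation between a combination $Xw$ and a target; a minimal sketch with a generic optimizer, where every variable name is an assumption:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(8)
X = rng.normal(size=(200, 4))                       # hypothetical predictors
y = X.dot([1.0, 0.5, -0.3, 0.0]) + 0.2 * rng.normal(size=200)

def neg_corr(w):
    # Negative Pearson correlation between X @ w and y (to be minimized).
    return -np.corrcoef(X.dot(w), y)[0, 1]

res = minimize(neg_corr, x0=np.ones(X.shape[1]))
print(res.x, -res.fun)   # correlation is scale-invariant, so w is only
                         # identified up to a positive scale factor
```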
0 votes
1 answer
85 views
Can I use regression to solve a multiple-equation problem?
I'm working on a problem that involves multiple equations. I have a group of people, and each person in the group is working on different tasks (e.g., n tasks in total). Each person in this group is working ...
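Under one reading of the (truncated) setup, each person's observed total is a linear combination of per-task values, which makes it an over-determined linear system that least squares can solve; everything in the sketch below is an assumed toy example:

```python
import numpy as np

A = np.array([[1.0, 0.5, 0.0],      # person 1's shares of tasks 1..3
              [0.0, 1.0, 1.0],      # person 2
              [2.0, 0.0, 0.5],      # person 3
              [1.0, 1.0, 1.0]])     # person 4
b = np.array([4.0, 7.0, 7.5, 9.0])  # observed total per person

x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(x)                            # estimated value per task
```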
1 vote
1 answer
68 views
Least Squares Regression $Ax=b$ when $A$ is fixed and $b$ is varied
The typical setting for least squares regression (or over-determined linear system) for $Ax=b$ is to solve $x$ given $A$ and $b$. In other words, $A$ and $b$ are fixed when we solve the problem. My ...
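One practical angle on the fixed-$A$, varying-$b$ setting is to factor $A$ once and reuse that factorization for every new right-hand side; a minimal sketch using a thin QR decomposition:

```python
import numpy as np

rng = np.random.default_rng(9)
A = rng.normal(size=(500, 10))     # fixed, over-determined system matrix
Q, R = np.linalg.qr(A)             # factored once, reused below

def solve_for(b):
    # Least-squares solution x = R^{-1} Q^T b for the fixed A.
    return np.linalg.solve(R, Q.T.dot(b))

for _ in range(3):                 # many different right-hand sides b
    b = rng.normal(size=500)
    print(solve_for(b)[:3])
```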